ABSTRACT
In the COVID-19 era, we have become increasingly reliant on virtual interactions such as Zoom meetings and Google Meet / Microsoft Teams chats. The video streams captured by live webcams during these interactions are a rich source for researchers studying human emotion. Owing to its numerous applications in human-computer interaction (HCI), the analysis of emotion from facial expressions has attracted considerable interest from the research community. The primary objective of this study is to recognize distinct emotions from facial expressions captured via a live web camera. Traditional (conventional FER) approaches rely on manual feature extraction before classifying the emotional state, whereas deep learning methods such as convolutional neural networks and transfer learning are now widely used for emotion classification because they learn discriminative features directly from images. In this implementation, we combine two established deep learning models, MTCNN and VGG-16, to extract features from facial landmarks and classify seven distinct emotions in live video. On the FER2013 benchmark dataset, we achieved a maximum accuracy of 97.23 percent for training and 60.2 percent for validation in emotion classification. © 2022 IEEE.